"SLM" meaning in All languages combined

See SLM on Wiktionary

Proper name [English]

Head templates: {{en-proper noun}} SLM
  1. Initialism of Sudan Liberation Movement. Tags: abbreviation, alt-of, initialism Alternative form of: Sudan Liberation Movement Synonyms: SLA
    Sense id: en-SLM-en-name-en:Sudan_Liberation_Movement

Noun [English]

Forms: SLMs [plural]
Head templates: {{en-noun}} SLM (plural SLMs)
  1. Initialism of single–longitudinal-mode laser. Tags: abbreviation, alt-of, initialism Alternative form of: single–longitudinal-mode laser
    Sense id: en-SLM-en-noun-en:single_longitudinal_mode_laser Categories (other): English links with manual fragments, English links with redundant wikilinks
  2. (machine learning) Initialism of small language model. Tags: abbreviation, alt-of, initialism Alternative form of: small language model Categories (topical): Machine learning
    Sense id: en-SLM-en-noun-en:small_language_model Categories (other): English links with manual fragments, English entries with incorrect language header, Pages with 1 entry, Pages with entries Disambiguation of English entries with incorrect language header: 34 4 61 Disambiguation of Pages with 1 entry: 26 3 71 Disambiguation of Pages with entries: 29 2 69

Inflected forms

{
  "head_templates": [
    {
      "args": {},
      "expansion": "SLM",
      "name": "en-proper noun"
    }
  ],
  "lang": "English",
  "lang_code": "en",
  "pos": "name",
  "senses": [
    {
      "alt_of": [
        {
          "word": "Sudan Liberation Movement"
        }
      ],
      "categories": [],
      "glosses": [
        "Initialism of Sudan Liberation Movement."
      ],
      "id": "en-SLM-en-name-en:Sudan_Liberation_Movement",
      "links": [
        [
          "Sudan Liberation Movement",
          "w:Sudan Liberation Movement"
        ]
      ],
      "senseid": [
        "en:Sudan Liberation Movement"
      ],
      "synonyms": [
        {
          "word": "SLA"
        }
      ],
      "tags": [
        "abbreviation",
        "alt-of",
        "initialism"
      ]
    }
  ],
  "word": "SLM"
}

{
  "forms": [
    {
      "form": "SLMs",
      "tags": [
        "plural"
      ]
    }
  ],
  "head_templates": [
    {
      "args": {},
      "expansion": "SLM (plural SLMs)",
      "name": "en-noun"
    }
  ],
  "lang": "English",
  "lang_code": "en",
  "pos": "noun",
  "senses": [
    {
      "alt_of": [
        {
          "word": "single–longitudinal-mode laser"
        }
      ],
      "categories": [
        {
          "kind": "other",
          "name": "English links with manual fragments",
          "parents": [
            "Links with manual fragments",
            "Entry maintenance"
          ],
          "source": "w"
        },
        {
          "kind": "other",
          "name": "English links with redundant wikilinks",
          "parents": [
            "Links with redundant wikilinks",
            "Entry maintenance"
          ],
          "source": "w"
        }
      ],
      "glosses": [
        "Initialism of single–longitudinal-mode laser."
      ],
      "id": "en-SLM-en-noun-en:single_longitudinal_mode_laser",
      "links": [
        [
          "single–longitudinal-mode",
          "w:longitudinal mode"
        ],
        [
          "single–longitudinal-mode laser",
          "w:longitudinal mode"
        ]
      ],
      "senseid": [
        "en:single longitudinal-mode laser"
      ],
      "tags": [
        "abbreviation",
        "alt-of",
        "initialism"
      ]
    },
    {
      "alt_of": [
        {
          "word": "small language model"
        }
      ],
      "categories": [
        {
          "kind": "other",
          "name": "English links with manual fragments",
          "parents": [
            "Links with manual fragments",
            "Entry maintenance"
          ],
          "source": "w"
        },
        {
          "kind": "topical",
          "langcode": "en",
          "name": "Machine learning",
          "orig": "en:Machine learning",
          "parents": [
            "Artificial intelligence",
            "Computer science",
            "Cybernetics",
            "Computing",
            "Sciences",
            "Applied mathematics",
            "Systems theory",
            "Technology",
            "All topics",
            "Mathematics",
            "Systems",
            "Fundamental",
            "Formal sciences",
            "Interdisciplinary fields",
            "Society"
          ],
          "source": "w"
        },
        {
          "_dis": "34 4 61",
          "kind": "other",
          "name": "English entries with incorrect language header",
          "parents": [
            "Entries with incorrect language header",
            "Entry maintenance"
          ],
          "source": "w+disamb"
        },
        {
          "_dis": "26 3 71",
          "kind": "other",
          "name": "Pages with 1 entry",
          "parents": [],
          "source": "w+disamb"
        },
        {
          "_dis": "29 2 69",
          "kind": "other",
          "name": "Pages with entries",
          "parents": [],
          "source": "w+disamb"
        }
      ],
      "examples": [
        {
          "bold_text_offsets": [
            [
              516,
              520
            ]
          ],
          "ref": "2025 April 13, Stephen Ornes, “Small Language Models Are the New Rage, Researchers Say. Larger models can pull off a wider variety of feats, but the reduced footprint of smaller models makes them attractive tools”, in Wired:",
          "text": "Large language models work well because they’re so large. The latest models from OpenAI, Meta, and DeepSeek use hundreds of billions of “parameters” […] With more parameters, the models are better able to identify patterns and connections, which in turn makes them more powerful and accurate. But this power comes at a cost […] huge computational resources […] energy hogs […] In response, some researchers are now thinking small. IBM, Google, Microsoft, and OpenAI have all recently released small language models (SLMs) that use a few billion parameters—a fraction of their LLM counterparts. Small models are not used as general-purpose tools like their larger cousins. But they can excel on specific, more narrowly defined tasks, such as summarizing conversations, answering patient questions as a health care chatbot, and gathering data in smart devices. “For a lot of tasks, an 8 billion–parameter model is actually pretty good,” said Zico Kolter, a computer scientist at Carnegie Mellon University. They can also run on a laptop or cell phone, instead of a huge data center. (There’s no consensus on the exact definition of “small,” but the new models all max out around 10 billion parameters.) To optimize the training process for these small models, researchers use a few tricks. […]",
          "type": "quote"
        }
      ],
      "glosses": [
        "Initialism of small language model."
      ],
      "id": "en-SLM-en-noun-en:small_language_model",
      "links": [
        [
          "machine learning",
          "machine learning"
        ],
        [
          "small language model",
          "small language model#English"
        ]
      ],
      "qualifier": "machine learning",
      "raw_glosses": [
        "(machine learning) Initialism of small language model."
      ],
      "senseid": [
        "en:small language model"
      ],
      "tags": [
        "abbreviation",
        "alt-of",
        "initialism"
      ]
    }
  ],
  "word": "SLM"
}
{
  "categories": [
    "English countable nouns",
    "English entries with incorrect language header",
    "English lemmas",
    "English nouns",
    "English proper nouns",
    "English uncountable nouns",
    "English words without vowels",
    "Pages with 1 entry",
    "Pages with entries"
  ],
  "head_templates": [
    {
      "args": {},
      "expansion": "SLM",
      "name": "en-proper noun"
    }
  ],
  "lang": "English",
  "lang_code": "en",
  "pos": "name",
  "senses": [
    {
      "alt_of": [
        {
          "word": "Sudan Liberation Movement"
        }
      ],
      "categories": [
        "English initialisms"
      ],
      "glosses": [
        "Initialism of Sudan Liberation Movement."
      ],
      "links": [
        [
          "Sudan Liberation Movement",
          "w:Sudan Liberation Movement"
        ]
      ],
      "senseid": [
        "en:Sudan Liberation Movement"
      ],
      "synonyms": [
        {
          "word": "SLA"
        }
      ],
      "tags": [
        "abbreviation",
        "alt-of",
        "initialism"
      ]
    }
  ],
  "word": "SLM"
}

{
  "categories": [
    "English countable nouns",
    "English entries with incorrect language header",
    "English lemmas",
    "English nouns",
    "English proper nouns",
    "English uncountable nouns",
    "English words without vowels",
    "Pages with 1 entry",
    "Pages with entries"
  ],
  "forms": [
    {
      "form": "SLMs",
      "tags": [
        "plural"
      ]
    }
  ],
  "head_templates": [
    {
      "args": {},
      "expansion": "SLM (plural SLMs)",
      "name": "en-noun"
    }
  ],
  "lang": "English",
  "lang_code": "en",
  "pos": "noun",
  "senses": [
    {
      "alt_of": [
        {
          "word": "single–longitudinal-mode laser"
        }
      ],
      "categories": [
        "English initialisms",
        "English links with manual fragments",
        "English links with redundant wikilinks"
      ],
      "glosses": [
        "Initialism of single–longitudinal-mode laser."
      ],
      "links": [
        [
          "single–longitudinal-mode",
          "w:longitudinal mode"
        ],
        [
          "single–longitudinal-mode laser",
          "w:longitudinal mode"
        ]
      ],
      "senseid": [
        "en:single longitudinal-mode laser"
      ],
      "tags": [
        "abbreviation",
        "alt-of",
        "initialism"
      ]
    },
    {
      "alt_of": [
        {
          "word": "small language model"
        }
      ],
      "categories": [
        "English initialisms",
        "English links with manual fragments",
        "English terms with quotations",
        "en:Machine learning"
      ],
      "examples": [
        {
          "bold_text_offsets": [
            [
              516,
              520
            ]
          ],
          "ref": "2025 April 13, Stephen Ornes, “Small Language Models Are the New Rage, Researchers Say. Larger models can pull off a wider variety of feats, but the reduced footprint of smaller models makes them attractive tools”, in Wired:",
          "text": "Large language models work well because they’re so large. The latest models from OpenAI, Meta, and DeepSeek use hundreds of billions of “parameters” […] With more parameters, the models are better able to identify patterns and connections, which in turn makes them more powerful and accurate. But this power comes at a cost […] huge computational resources […] energy hogs […] In response, some researchers are now thinking small. IBM, Google, Microsoft, and OpenAI have all recently released small language models (SLMs) that use a few billion parameters—a fraction of their LLM counterparts. Small models are not used as general-purpose tools like their larger cousins. But they can excel on specific, more narrowly defined tasks, such as summarizing conversations, answering patient questions as a health care chatbot, and gathering data in smart devices. “For a lot of tasks, an 8 billion–parameter model is actually pretty good,” said Zico Kolter, a computer scientist at Carnegie Mellon University. They can also run on a laptop or cell phone, instead of a huge data center. (There’s no consensus on the exact definition of “small,” but the new models all max out around 10 billion parameters.) To optimize the training process for these small models, researchers use a few tricks. […]",
          "type": "quote"
        }
      ],
      "glosses": [
        "Initialism of small language model."
      ],
      "links": [
        [
          "machine learning",
          "machine learning"
        ],
        [
          "small language model",
          "small language model#English"
        ]
      ],
      "qualifier": "machine learning",
      "raw_glosses": [
        "(machine learning) Initialism of small language model."
      ],
      "senseid": [
        "en:small language model"
      ],
      "tags": [
        "abbreviation",
        "alt-of",
        "initialism"
      ]
    }
  ],
  "word": "SLM"
}

Download raw JSONL data for SLM meaning in All languages combined (3.9kB)
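The raw JSONL data uses one JSON object per line, with the structure shown in the blocks above ("word", "pos", "senses", "glosses", "tags"). A minimal sketch of consuming it, using a two-line sample that mirrors the entries on this page (the helper `iter_glosses` is illustrative, not part of wiktextract):

```python
import json

# Sample JSONL: one entry per line, matching the field layout above.
sample_jsonl = """\
{"word": "SLM", "pos": "name", "senses": [{"glosses": ["Initialism of Sudan Liberation Movement."], "tags": ["abbreviation", "alt-of", "initialism"]}]}
{"word": "SLM", "pos": "noun", "senses": [{"glosses": ["Initialism of small language model."], "tags": ["abbreviation", "alt-of", "initialism"]}]}
"""

def iter_glosses(jsonl_text):
    """Yield (word, pos, gloss) triples from kaikki-style JSONL data."""
    for line in jsonl_text.splitlines():
        if not line.strip():
            continue
        entry = json.loads(line)
        for sense in entry.get("senses", []):
            for gloss in sense.get("glosses", []):
                yield entry["word"], entry["pos"], gloss

for word, pos, gloss in iter_glosses(sample_jsonl):
    print(f"{word} ({pos}): {gloss}")
# SLM (name): Initialism of Sudan Liberation Movement.
# SLM (noun): Initialism of small language model.
```

The same loop works unchanged on the downloadable file by replacing `sample_jsonl.splitlines()` with a line iterator over the open file.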


This page is part of the kaikki.org machine-readable All languages combined dictionary. This dictionary is based on structured data extracted on 2025-04-26 from the enwiktionary dump dated 2025-04-20 using wiktextract (89e900c and ea19a0a). The data shown on this site has been post-processed: various details (e.g., extra categories) have been removed, some information has been disambiguated, and additional data has been merged from other sources. See the raw data download page for the unprocessed wiktextract data.

If you use this data in academic research, please cite Tatu Ylonen: Wiktextract: Wiktionary as Machine-Readable Structured Data, Proceedings of the 13th Conference on Language Resources and Evaluation (LREC), pp. 1317-1325, Marseille, 20-25 June 2022. Linking to the relevant page(s) under https://kaikki.org would also be greatly appreciated.